Ideal Code Constrained Supervised Sparse Coding
Abstract
In this paper, we propose a novel sparse coding algorithm that uses class labels to constrain the learning of both the codebook and the sparse codes. We use the class labels not only to train the classifier, but also to construct class-conditional codewords that make the sparse codes as discriminative as possible. We first construct ideal sparse codes with respect to the class-conditional codewords, and then constrain the learned sparse codes toward these ideal sparse codes. We propose a novel loss function composed of the sparse reconstruction error, the classification error, and the ideal sparse code constraint error. This problem can be optimized with the traditional K-SVD method. In this way, we learn a discriminative classifier and a discriminative codebook simultaneously. Moreover, with this codebook, the learned sparse codes of samples from the same class are similar to each other. Finally, exhaustive experimental results show that the proposed algorithm outperforms other sparse coding methods.
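The three-term loss described above can be sketched in NumPy. This is a minimal sketch under stated assumptions, not the authors' implementation: the function name, the trade-off weights `alpha` and `beta`, and the exact form of the ideal-code matrix `Q` are hypothetical, inferred from the abstract's description (reconstruction error, classification error, and ideal sparse code constraint error).

```python
import numpy as np

def icc_ssc_objective(X, D, A, W, H, Q, alpha, beta):
    """Hypothetical sketch of the three-term loss from the abstract.

    X : (d, n) data matrix
    D : (d, k) codebook
    A : (k, n) sparse codes
    W : (c, k) linear classifier
    H : (c, n) label matrix (one column per sample)
    Q : (k, n) assumed "ideal" sparse codes built from the
        class-conditional codewords (nonzero only where a codeword
        belongs to the sample's class)
    alpha, beta : hypothetical trade-off weights (not given in the abstract)
    """
    reconstruction = np.linalg.norm(X - D @ A, "fro") ** 2  # sparse reconstruction error
    classification = np.linalg.norm(H - W @ A, "fro") ** 2  # classification error
    ideal_code = np.linalg.norm(Q - A, "fro") ** 2          # ideal sparse code constraint error
    return reconstruction + alpha * classification + beta * ideal_code
```

When the codes exactly match the ideal codes, the data is perfectly reconstructed, and the classifier fits the labels, all three terms vanish; K-SVD-style optimization would alternate between updating `D`/`W` and the sparse codes `A`.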
Similar resources
Top-down saliency with Locality-constrained Contextual Sparse Coding
We propose a locality-constrained contextual sparse coding (LCCSC) for top-down saliency estimation, where higher saliency scores are assigned to the image locations corresponding to the target object. Three locality constraints are integrated into this novel sparse coding. First is the spatial or contextual locality constraint, in which features from adjacent regions have similar codes; second i...
Supervised Deep Sparse Coding Networks
In this paper, we propose a novel multilayer sparse coding network capable of efficiently adapting its own regularization parameters to a given dataset. The network is trained end-to-end with a supervised task-driven learning algorithm via error backpropagation. During training, the network learns both the dictionaries and the regularization parameters of each sparse coding layer so that the re...
Label propagation based on local information with adaptive determination of number and degree of neighbor's similarity
In many practical applications of machine vision, a small number of samples are labeled and therefore, classification accuracy is low. On the other hand, labeling by humans is a very time consuming process, which requires a degree of proficiency. Semi-supervised learning algorithms may be used as a proper solution in these situations, where ε-neighborhood or k nearest neighborhood graphs are em...
Shift-Invariant Sparse Coding for Audio Classification
Sparse coding is an unsupervised learning algorithm that learns a succinct high-level representation of the inputs given only unlabeled data; it represents each input as a sparse linear combination of a set of basis functions. Originally applied to modeling the human visual cortex, sparse coding has also been shown to be useful for self-taught learning, in which the goal is to solve a supervise...
Journal: JCP
Volume: 9
Pages: -
Publication year: 2014